PAC-Bayesian Margin Bounds for Convolutional Neural Networks - Technical Report
Authors
Abstract
Recently, the generalisation error of deep neural networks has been analysed through the PAC-Bayesian framework for the case of fully connected layers. We adapt this approach to the convolutional setting.
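A central quantity in such bounds is the spectral norm of each layer's linear map. As a minimal illustrative sketch (our own example, not the report's method), assuming a single-channel 2-D convolution with circular (periodic) padding, the induced operator is doubly circulant and hence diagonalised by the 2-D DFT, so its spectral norm is the largest magnitude of the DFT of the zero-padded kernel:

```python
import numpy as np

def circular_conv_spectral_norm(kernel, input_shape):
    """Spectral norm (largest singular value) of single-channel 2-D
    circular convolution with `kernel` on inputs of shape `input_shape`.
    A circulant operator is diagonalised by the unitary DFT, so its
    singular values are the magnitudes of the DFT of the padded kernel."""
    padded = np.zeros(input_shape)
    kh, kw = kernel.shape
    padded[:kh, :kw] = kernel          # zero-pad the kernel to input size
    return np.abs(np.fft.fft2(padded)).max()

# Example: a 3x3 averaging kernel on 16x16 inputs has spectral norm 1,
# since the DC component of a nonnegative kernel dominates.
print(circular_conv_spectral_norm(np.full((3, 3), 1.0 / 9.0), (16, 16)))
```

For other padding schemes the convolution is no longer circulant, and the spectral norm is usually estimated by power iteration on the forward and adjoint maps instead.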
Similar Resources
A PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks
We present a generalization bound for feedforward neural networks in terms of the product of the spectral norms of the layers and the Frobenius norm of the weights. The generalization bound is derived using a PAC-Bayes analysis.
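For orientation, the bound described in this abstract has roughly the following shape (paraphrased from memory, hedged accordingly): for a d-layer network with weights W_1, ..., W_d, width h, inputs of norm at most B, margin γ, and m training samples, with probability at least 1 − δ,

$$
L_0(f_w) \;\le\; \hat{L}_\gamma(f_w) \;+\; \tilde{O}\!\left(\sqrt{\frac{B^2 d^2 h \ln(dh)\,\prod_{i=1}^{d}\|W_i\|_2^2\,\sum_{i=1}^{d}\frac{\|W_i\|_F^2}{\|W_i\|_2^2} \;+\; \ln\frac{dm}{\delta}}{\gamma^2 m}}\right),
$$

so the complexity term is the product of squared spectral norms rescaled by the sum of squared Frobenius-to-spectral ratios, matching the abstract's description.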
On PAC-Bayesian Margin Bounds
Over the past few years, progress has been made in obtaining dimension independent margin bounds. In this note, we revisit the PAC-Bayesian margin bounds proposed by Langford and Shawe-Taylor [4] and later refined by McAllester [6]. In addition to simplifying some of the existing arguments, we use a tighter tail bound on the normal distribution to give an explicit margin bound that is a mild va...
Simplified PAC-Bayesian Margin Bounds
The theoretical understanding of support vector machines is largely based on margin bounds for linear classifiers with unit-norm weight vectors and unit-norm feature vectors. Unit-norm margin bounds have been proved previously using fat-shattering arguments and Rademacher complexity. Recently Langford and Shawe-Taylor proved a dimension-independent unit-norm margin bound using a relatively simpl...
Generalisation Error Bounds for Sparse Linear Classifiers
We provide small sample size bounds on the generalisation error of linear classifiers that are sparse in their dual representation given by the expansion coefficients of the weight vector in terms of the training data. These results theoretically justify algorithms like the Support Vector Machine, the Relevance Vector Machine and K-nearest-neighbour. The bounds are a-posteriori bounds to be evalua...
A PAC-Bayesian Tutorial with A Dropout Bound
This tutorial gives a concise overview of existing PAC-Bayesian theory focusing on three generalization bounds. The first is an Occam bound which handles rules with finite precision parameters and which states that generalization loss is near training loss when the number of bits needed to write the rule is small compared to the sample size. The second is a PAC-Bayesian bound providing a genera...
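One standard Hoeffding-style instance of such an Occam bound (an assumed form consistent with the snippet, not necessarily the tutorial's exact statement): for any hypothesis h with description length |h| bits, with probability at least 1 − δ over m samples,

$$
L(h) \;\le\; \hat{L}(h) + \sqrt{\frac{|h|\ln 2 + \ln(1/\delta)}{2m}},
$$

which is small exactly when the bit length |h| is small relative to the sample size m, as the snippet states.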
Journal: CoRR
Volume: abs/1801.00171
Pages: -
Publication date: 2017